Smooth Primal-Dual Coordinate Descent Algorithms for Nonsmooth Convex Optimization

Authors

  • Ahmet Alacaoglu
  • Quoc Tran-Dinh
  • Olivier Fercoq
  • Volkan Cevher
Abstract

We propose a new randomized coordinate descent method for a convex optimization template with broad applications. Our analysis relies on a novel combination of four ideas applied to the primal-dual gap function: smoothing, acceleration, homotopy, and coordinate descent with non-uniform sampling. As a result, our method features the first convergence rate guarantees among coordinate descent methods that are the best known under a variety of common structure assumptions on the template. We provide numerical evidence to support the theoretical results, with a comparison to state-of-the-art algorithms.
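To make one ingredient of the abstract concrete, the sketch below illustrates randomized coordinate descent with non-uniform (Lipschitz-based) sampling on a simple smooth quadratic. This is not the paper's full primal-dual method (which also involves smoothing, acceleration, and homotopy); the function names and the toy problem are our own illustrative choices.

```python
import numpy as np

def coordinate_descent(A, b, iters=5000, seed=0):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    by randomized coordinate descent with non-uniform sampling."""
    rng = np.random.default_rng(seed)
    n = len(b)
    L = np.diag(A).copy()      # coordinate-wise Lipschitz constants L_i = A_ii
    p = L / L.sum()            # sample coordinate i with probability L_i / sum_j L_j
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(n, p=p)
        g_i = A[i] @ x - b[i]  # partial derivative of f along coordinate i
        x[i] -= g_i / L[i]     # exact minimization along coordinate i
    return x

# Example: a small SPD system; the iterates approach the solution of A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = coordinate_descent(A, b)
```

Sampling coordinates proportionally to their Lipschitz constants, rather than uniformly, gives the faster rates that non-uniform-sampling analyses exploit.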


Similar papers

Stochastic Coordinate Descent Methods for Regularized Smooth and Nonsmooth Losses

Stochastic Coordinate Descent (SCD) methods are among the first optimization schemes suggested for efficiently solving large-scale problems. However, until now, there has been a gap between the convergence rate analysis and practical SCD algorithms for general smooth losses, and there is no primal SCD algorithm for nonsmooth losses. In this paper, we discuss these issues using the recently develop...


Duality between subgradient and conditional gradient methods

Given a convex optimization problem and its dual, there are many possible first-order algorithms. In this paper, we show the equivalence between mirror descent algorithms and algorithms generalizing the conditional gradient method. This is done through convex duality and implies notably that for certain problems, such as supervised machine learning problems with nonsmooth losses or problems ...


Primal and dual predicted decrease approximation methods

We introduce the notion of predicted decrease approximation (PDA) for constrained convex optimization, a flexible framework which includes as special cases known algorithms such as generalized conditional gradient, proximal gradient, greedy coordinate descent for separable constraints and working set methods for linear equality constraints with bounds. The new scheme allows the development of a...


Smooth Optimization Approach for Covariance Selection

In this paper we study a smooth optimization approach for solving a class of nonsmooth strongly concave maximization problems. In particular, we apply Nesterov's smooth optimization technique [17, 18] to their dual counterparts, which are smooth convex problems. It is shown that the resulting approach has O(1/√ε) iteration complexity for finding an ε-optimal solution to both primal and dual probl...
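The smoothing idea referenced above can be sketched on the simplest nonsmooth function: replacing |t| by its conjugate-smoothed surrogate f_μ(t) = max_{|u|≤1} (ut − μu²/2), which works out to the familiar Huber function. This is a generic illustration of Nesterov-style smoothing, not the specific construction used in the covariance selection paper.

```python
import numpy as np

# Huber smoothing of |t|: quadratic t^2/(2*mu) near the origin (|t| <= mu),
# linear |t| - mu/2 outside. Its gradient clip(t/mu, -1, 1) is Lipschitz
# with constant 1/mu, so smooth first-order methods apply.

def huber(t, mu):
    return np.where(np.abs(t) <= mu, t**2 / (2 * mu), np.abs(t) - mu / 2)

def huber_grad(t, mu):
    return np.clip(t / mu, -1.0, 1.0)
```

Shrinking μ tightens the approximation (the gap to |t| is at most μ/2) at the cost of a larger gradient Lipschitz constant 1/μ, which is the trade-off behind the O(1/√ε)-type complexity bounds mentioned above.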


Stochastic Coordinate Descent for Nonsmooth Convex Optimization

Stochastic coordinate descent, due to its practicality and efficiency, is increasingly popular in the machine learning and signal processing communities, as it has proven successful in several large-scale optimization problems, such as ℓ1-regularized regression and support vector machines, to name a few. In this paper, we consider a composite problem where the nonsmoothness has a general structure that...




Publication date: 2017